Posterior contraction and testing for multivariate isotonic regression


Abstract

We consider the nonparametric regression problem with multiple predictors and an additive error, where the regression function is assumed to be coordinatewise nondecreasing. We propose a Bayesian approach to make inference on the multivariate monotone regression function, obtain the posterior contraction rate, and construct a universally consistent testing procedure for monotonicity. To facilitate the analysis, we temporarily set aside the shape restriction and endow a prior on blockwise constant regression functions with independently normally distributed heights. The unknown variance of the error term is either estimated by its marginal maximum likelihood estimate or equipped with an inverse-gamma prior. Then, by conjugacy, the unrestricted block heights are a posteriori also independently normally distributed given the error variance. To comply with the shape restriction, we project samples from this unrestricted posterior onto the class of multivariate monotone functions, inducing a "projection-posterior distribution" that is used for making inference. Under an L1-metric, we show that the projection-posterior based on n independent observations contracts around the true monotone regression function at the optimal rate n^{-1/(2+d)}. We then test monotonicity using the posterior probability of a shrinking neighborhood of the class of monotone functions. The test is universally consistent, that is, its level goes to zero and its power at any fixed alternative goes to one. Moreover, the power at a smooth alternative goes to one as long as its distance to the class of monotone functions is at least of the order of the estimation error for a smooth function. To the best of our knowledge, no other such test is available in the frequentist literature.
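The projection-posterior construction can be sketched in one dimension under simplifying assumptions not made in the paper (known error variance, equal-width blocks, a N(0, tau^2) prior on the heights): draw unrestricted block heights from the conjugate normal posterior, then project each draw onto the monotone cone. In one dimension the L2 projection is the classical pool-adjacent-violators algorithm (PAVA); all variable names below are illustrative, and this is a minimal sketch rather than the authors' exact procedure (which handles d predictors and an unknown variance).

```python
import numpy as np

def pava(y, w=None):
    """Pool-adjacent-violators: weighted L2 projection of the sequence y
    onto the cone of nondecreasing sequences."""
    y = np.asarray(y, dtype=float)
    w = np.ones_like(y) if w is None else np.asarray(w, dtype=float)
    vals, wts, sizes = [], [], []
    for v, wt in zip(y, w):
        vals.append(v); wts.append(wt); sizes.append(1)
        # Merge adjacent blocks while monotonicity is violated.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            v2, w2, s2 = vals.pop(), wts.pop(), sizes.pop()
            vals[-1] = (wts[-1] * vals[-1] + w2 * v2) / (wts[-1] + w2)
            wts[-1] += w2
            sizes[-1] += s2
    return np.repeat(vals, sizes)

rng = np.random.default_rng(0)

# Simulated data from a monotone regression function with additive noise.
n, J, sigma, tau = 200, 10, 0.3, 10.0   # J blocks; tau = prior sd of heights
x = np.sort(rng.uniform(0.0, 1.0, n))
y = x**2 + sigma * rng.normal(size=n)

# Unrestricted prior: blockwise-constant function, heights iid N(0, tau^2).
# With sigma known, each height's posterior is normal by conjugacy.
block = np.minimum((x * J).astype(int), J - 1)
counts = np.bincount(block, minlength=J)
sums = np.bincount(block, weights=y, minlength=J)
post_var = 1.0 / (counts / sigma**2 + 1.0 / tau**2)
post_mean = post_var * sums / sigma**2

# Projection-posterior: sample unrestricted heights, then project each draw
# onto the monotone cone, weighting blocks by their sample counts.
S = 500
draws = post_mean + np.sqrt(post_var) * rng.normal(size=(S, J))
projected = np.array([pava(d, w=counts) for d in draws])
f_hat = projected.mean(axis=0)   # projection-posterior mean of the heights
```

Every projected draw is a monotone step function, so any functional of the projection-posterior (mean, credible bands, the neighborhood probability used in the test) can be read off the sample `projected`. In the multivariate case the projection is onto coordinatewise nondecreasing functions, which no longer reduces to a single PAVA pass.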


Similar Resources

Algorithms for L∞ Isotonic Regression

This paper gives algorithms for determining L∞ weighted isotonic regressions satisfying order constraints given by a DAG with n vertices and m edges. Throughout, topological sorting plays an important role. A modification to an algorithm of Kaufman and Tamir gives an algorithm taking Θ(m log n) time for the general case, improving upon theirs when the graph is sparse. When the regression values...


Inference for Multiple Isotonic Regression

The isotonic regression for two or more independent variables is a classic problem in data analysis. The classical solution involves enumeration of upper sets, which is computationally prohibitive unless the sample size is small. Here it is shown that the solution may be obtained through a single projection onto a convex polyhedral cone. The cone formulation allows an exact test of the null hyp...


Venn predictors and isotonic regression

This note introduces Venn–Abers predictors, a new class of Venn predictors based on the idea of isotonic regression. As all Venn predictors, Venn–Abers predictors are well calibrated under the exchangeability assumption.


Bayesian isotonic density regression.

Density regression models allow the conditional distribution of the response given predictors to change flexibly over the predictor space. Such models are much more flexible than nonparametric mean regression models with nonparametric residual distributions, and are well supported in many applications. A rich variety of Bayesian methods have been proposed for density regression, but it is not c...


Optimal Reduced Isotonic Regression

Isotonic regression is a shape-constrained nonparametric regression in which the ordinate is a nondecreasing function of the abscissa. The regression outcome is an increasing step function. For an initial set of n points, the number of steps in the isotonic regression, m, may be as large as n. As a result, the full isotonic regression has been criticized as overfitting the data or making the re...



Journal

Journal title: Electronic Journal of Statistics

Year: 2023

ISSN: 1935-7524

DOI: https://doi.org/10.1214/23-ejs2115